A novel measure for independent component analysis (ICA)
Authors
Abstract
Measures of independence (and dependence) are fundamental in many areas of engineering and signal processing. Shannon introduced the notion of information entropy, which has a sound theoretical foundation but is sometimes difficult to implement in engineering applications. In this paper, Renyi's entropy is used instead and a novel independence measure is proposed. When integrated with a nonparametric estimator of the probability density function (the Parzen window), the measure can be related to the "potential energy of the samples," which is easy to understand and implement. Experimental results on blind source separation confirm the theory. Although the work is preliminary, the "potential energy" method is rather general and will have many applications.
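The abstract gives no implementation, but the connection between Renyi's entropy, the Parzen window, and a pairwise "potential energy" can be made concrete: with Gaussian kernels placed on every sample, the Parzen estimate of the integral of p(x)^2 collapses into a double sum of kernel evaluations over sample pairs, each term acting like an interaction potential between two "particles." The sketch below illustrates only this generic estimator; the kernel width and the choice of the quadratic (order-2) Renyi entropy are assumptions for illustration, not necessarily the exact measure proposed in the paper.

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of V(X) = integral of p(x)^2 dx.
    With Gaussian kernels of width sigma, V(X) reduces to
    (1/N^2) * sum_{i,j} G(x_i - x_j; 2*sigma^2): a sum of pairwise
    "potential energies" between samples."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    diff = x[:, None, :] - x[None, :, :]            # pairwise differences
    sq = np.sum(diff ** 2, axis=-1)                 # squared distances
    var = 2.0 * sigma ** 2                          # kernel variance doubles under convolution
    g = np.exp(-sq / (2.0 * var)) / ((2.0 * np.pi * var) ** (d / 2.0))
    return g.sum() / (n * n)

def renyi_quadratic_entropy(x, sigma=1.0):
    """Quadratic Renyi entropy H_2(X) = -log V(X)."""
    return -np.log(information_potential(x, sigma))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.normal(size=500)
    print("potential energy V:", information_potential(samples, sigma=0.5))
    print("Renyi entropy H_2: ", renyi_quadratic_entropy(samples, sigma=0.5))
```

Because the estimator depends only on pairwise distances between samples, it can be optimized during ICA training without an explicit density model, which is the practical appeal of the "potential energy" view.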
Similar articles
A PCA/ICA based Fetal ECG Extraction from Mother Abdominal Recordings by Means of a Novel Data-driven Approach to Fetal ECG Quality Assessment
Background: Fetal electrocardiography is a developing field that provides valuable information on fetal health during pregnancy. Early diagnosis and treatment of fetal heart problems give the infant a better chance of survival. Objective: Here, we extract the fetal ECG from maternal abdominal recordings and detect R-peaks in order to determine the fetal heart rate. In the next step, we find a be...
Speech enhancement based on hidden Markov model using sparse code shrinkage
This paper presents a new hidden Markov model-based (HMM-based) speech enhancement framework based on independent component analysis (ICA). We propose analytical procedures for training the clean speech and noise models with the Baum re-estimation algorithm and present a maximum a posteriori (MAP) estimator based on a Laplacian-Gaussian combination (for the clean speech and noise, respectively) in the HMM ...
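The HMM training and the full MAP estimator are specific to that paper; as a rough, self-contained illustration of the sparse code shrinkage ingredient alone, the sketch below applies the standard MAP shrinkage rule (soft thresholding) for a Laplacian-distributed component observed in additive Gaussian noise. The parameter names and values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def laplace_map_shrinkage(y, noise_var, laplace_scale):
    """MAP estimate of a sparse component s from y = s + n, where
    n ~ N(0, noise_var) and p(s) is proportional to exp(-|s| / laplace_scale).
    The solution is soft thresholding with threshold noise_var / laplace_scale."""
    threshold = noise_var / laplace_scale
    return np.sign(y) * np.maximum(np.abs(y) - threshold, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    s = rng.laplace(scale=1.0, size=1000)          # sparse "clean" component
    y = s + rng.normal(scale=0.5, size=1000)       # noisy observation
    s_hat = laplace_map_shrinkage(y, noise_var=0.25, laplace_scale=1.0)
    print("MSE before shrinkage:", np.mean((y - s) ** 2))
    print("MSE after shrinkage: ", np.mean((s_hat - s) ** 2))
```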
Estimating Squared-Loss Mutual Information for Independent Component Analysis
Accurately evaluating statistical independence among random variables is a key component of Independent Component Analysis (ICA). In this paper, we employ a squared-loss variant of mutual information as an independence measure and give its estimation method. Our basic idea is to estimate the ratio of probability densities directly without going through density estimation, by which a hard task o...
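As a rough illustration of the density-ratio idea (a simplified sketch, not the estimator from that paper), the code below fits the ratio p(x, y) / (p(x) p(y)) with a regularized least-squares kernel model and plugs it into the squared-loss mutual information; the kernel width, regularization strength, and number of basis functions are illustrative assumptions.

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=1e-3, n_basis=100, seed=0):
    """Least-squares estimate of squared-loss mutual information between
    paired samples x and y.  The density ratio r(x, y) = p(x, y) / (p(x) p(y))
    is modelled as a linear combination of Gaussian kernels centred at a
    random subset of the paired samples and fitted by regularized least
    squares; SMI is then approximated by 0.5 * h.alpha - 0.5."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    b = min(n_basis, n)
    centres = rng.choice(n, size=b, replace=False)

    def gauss(a, c):
        d2 = np.sum((a[:, None, :] - c[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    Kx = gauss(x, x[centres])                      # (n, b) kernels in x
    Ky = gauss(y, y[centres])                      # (n, b) kernels in y
    h = np.mean(Kx * Ky, axis=0)                   # average of phi over paired samples
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / (n * n)        # average of phi phi^T over all (i, j) pairs
    alpha = np.linalg.solve(H + lam * np.eye(b), h)
    return 0.5 * h @ alpha - 0.5

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = rng.normal(size=300)
    print("dependent pair:  ", lsmi(x, x + 0.3 * rng.normal(size=300)))
    print("independent pair:", lsmi(x, rng.normal(size=300)))
```

An estimate near zero indicates approximate independence, which is what an ICA algorithm would drive the extracted components toward.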
Reliability in ICA-Based Text Classification
This paper introduces a novel approach for improving the reliability of ICA-based text classifiers, attempting to make the most of the independent component analysis data. In this framework, two issues are addressed: firstly, a relative relevance measure for category assignment is presented; secondly, a reliability control process is included in the classifier, avoiding the classification o...
A review on EEG based brain computer interface systems feature extraction methods
The brain-computer interface (BCI) provides a communication channel between human and machine. Most of these systems are based on brain activities. Brain-computer interfacing is a methodology that provides a way to communicate with the outside environment using brain thoughts. The success of this methodology depends on the selection of methods to process the brain signals in each pha...
Performance comparison of new nonparametric independent component analysis algorithm for different entropic indexes
Most independent component analysis (ICA) algorithms use mutual information (MI) measures based on Shannon entropy as a cost function, but Shannon entropy is not the only measure in the literature. In this paper, instead of Shannon entropy, Tsallis entropy is used and a novel ICA algorithm, which uses kernel density estimation (KDE) for estimation of source distributions, is proposed. KDE is di...
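As a minimal illustration of the plug-in approach (not the algorithm proposed in that paper), the sketch below estimates the order-q Tsallis entropy S_q = (1 - integral of p(x)^q dx) / (q - 1) from samples by evaluating a Gaussian KDE at the sample points; the entropic index q and the bandwidth are illustrative assumptions.

```python
import numpy as np

def kde_density(samples, points, bandwidth):
    """Gaussian kernel density estimate of 1-D `samples`, evaluated at `points`."""
    diff = points[:, None] - samples[None, :]
    k = np.exp(-0.5 * (diff / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    return k.mean(axis=1)

def tsallis_entropy(samples, q=2.0, bandwidth=0.3):
    """Plug-in (resubstitution) estimate of the Tsallis entropy
    S_q = (1 - E[p(X)^(q-1)]) / (q - 1), using a Gaussian KDE for p."""
    samples = np.asarray(samples, float)
    p = kde_density(samples, samples, bandwidth)
    return (1.0 - np.mean(p ** (q - 1))) / (q - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    print(tsallis_entropy(rng.normal(size=2000)))          # broad density, larger entropy
    print(tsallis_entropy(0.1 * rng.normal(size=2000)))    # concentrated density, smaller entropy
```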
Publication date: 1998